
# Multi-task pretraining

## Tibert Base
A BERT-base model pretrained specifically for Tigrinya, trained for 40 epochs on a dataset of 40 million tokens.
Categories: Large Language Model, Other
Author: fgaim
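
For context, here is a minimal sketch of how a masked language model like this might be queried with the Hugging Face transformers library. The repo ID `fgaim/tibert-base` and the sample Tigrinya sentence are assumptions inferred from the listing, not details it provides.

```python
# Minimal sketch, assuming the model is published on the Hugging Face Hub
# under the repo ID "fgaim/tibert-base" (inferred from the listing above).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="fgaim/tibert-base")

# BERT-style models are queried by masking a token; the sentence below is a
# hypothetical Tigrinya example, roughly "Tigrinya is the language of [MASK]."
for prediction in fill_mask("ትግርኛ ናይ [MASK] ቋንቋ እዩ።"):
    print(prediction["token_str"], round(prediction["score"], 3))
```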
## Code Trans T5 Large Source Code Summarization Python Multitask Finetune
A pretrained model based on the T5-large architecture, designed for Python source-code summarization tasks with multi-task learning support.
Category: Text Generation
Author: SEBIS
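
As a usage illustration, here is a minimal sketch of generating a summary with transformers. The repo ID is an assumption pieced together from the author and model name, and the expectation of pre-tokenized input follows the general CodeTrans convention rather than anything stated in this listing.

```python
# Minimal sketch, assuming the repo ID below (inferred from the author "SEBIS"
# and the listed model name; verify on the Hugging Face Hub before use).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# CodeTrans-style models are generally fed space-separated, pre-tokenized code.
code = "def add ( a , b ) : return a + b"
inputs = tokenizer(code, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```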